Big Tech Has Banned Trump. Now What?

2021-01-16

As the world accepts a Twitter without @realdonaldtrump, the big question is: "Now what?"
Major technology companies have long been accused of giving President Donald Trump special treatment that other users did not receive. Now, tech companies have banned Trump from their platforms after a mob led by his supporters attacked the U.S. Capitol on January 6.
Trump was blocked from Twitter, Facebook, Snapchat and other social media platforms. In many ways, removing the president was the easy part. But what happens now?
Will tech companies hold other world leaders to the same standard of behavior? Will they go further in deciding what is and is not permitted on their platforms, perhaps angering many of their users? Will all this cause additional online divisions that will push people with extreme ideas onto secret platforms?
Although they've long tried to remain neutral, Facebook, Twitter and other platforms are slowly finding that they can play an active part in shaping the modern world. Their services are used by many angry groups as well as people pushing misinformation about science, politics and medicine.
The companies are moving from defending "free-speech absolutism" towards "an understanding of speech moderation as a matter of public health," said media professor Ethan Zuckerman of the University of Massachusetts-Amherst.
None of this can be fixed quickly, and banning a president with only a few more days in office is not the answer.
But there are ways to be more effective.
When the 26-minute video "Plandemic" suddenly appeared on the internet, it received millions of views in just a few days. It was filled with untrue information that pointed to a worldwide COVID-19 conspiracy. Facebook, Twitter and YouTube removed it only after the video had received millions of views. But the companies were ready for part two of the video. When it appeared, it was removed immediately and received very little attention.
"Sharing disinformation about COVID is a danger because it makes it harder for us to fight the disease," Zuckerman said. He added that "sharing disinformation about voting is an attack on our democracy."
It has been easier for tech companies to act on matters of public health than on politics. Corporate reactions to Trump and his supporters have led to angry cries of censorship. Such actions even drew criticism from European leaders such as German Chancellor Angela Merkel, who has little love for Trump.
Merkel's spokesman, Steffen Seibert, said freedom of opinion is one of our most basic rights. He told reporters that such a right can only be removed or changed by governments, not by "a decision by the management of social media platforms."
That may be possible in Europe, but it is much more complex in the U.S., where the First Amendment of the U.S. Constitution protects freedom of expression from government rules. However, it does not protect freedom of expression from corporate rules on privately owned communication platforms.
Governments, of course, remain free to regulate tech companies. Over the past year, Trump, other Republicans and some Democrats have called for the removal of a 1996 law known as Section 230. The law protects social media platforms from being sued for a lot of money by anyone who feels wronged by something someone else has posted.
Still, few are happy with the often slow reactions of companies like Twitter and Facebook to events like the U.S. Capitol attack, other violent events or live-streamed shootings.
Sarita Schoenebeck is a University of Michigan professor who studies online harassment. She said it might be time for the platforms to reexamine how they react to problematic material.
Until recently, tech companies looked only at problematic material on its own. They did not think about "the broader social and cultural" effect, she said. She added that companies should look at democratic ideals, community governance and platform rules to "shape behavior."
Jared Schroeder is an expert on social media and the First Amendment at Southern Methodist University. He thinks the Trump bans will push supporters to more secretive platforms where "they can organize and communicate."
"The bans have taken away the best tools for organizing people and for Trump to speak to the largest audiences, but these are by no means the only tools," Schroeder said.
I'm Susan Shand.
The Associated Press reported this story. Susan Shand adapted it for Learning English. Bryan Lynn was the editor.
________________________________________________________________
Words in This Story
platform - n. something that allows someone to tell a large number of people about an idea, product, etc.
absolutism - n. a philosophy of never making exceptions
conspiracy - n. a secret plan made by two or more people to do something that is harmful or illegal
censorship - n. a system of examining books, movies, letters, etc., and removing things that are considered to be offensive, immoral, or harmful to society
management - n. the control or organization of something
regulate - v. to make rules or laws that control something
sue - v. to use a legal process by which you try to get a court of law to force a person, company, or organization that has treated you unfairly or hurt you in some way to give you something or to do something
live-stream - v. to put on the internet pictures of events as they happen
harassment - n. behavior that annoys or troubles someone
We want to hear from you. Write to us in the Comments Section, and visit our Facebook page.